    Convergence of Langevin MCMC in KL-divergence

    Langevin diffusion is a commonly used tool for sampling from a given distribution. In this work, we establish that when the target density $p^*$ is such that $\log p^*$ is $L$-smooth and $m$-strongly convex, discrete Langevin diffusion produces a distribution $p$ with $\mathrm{KL}(p \,\|\, p^*) \leq \epsilon$ in $\tilde{O}(d/\epsilon)$ steps, where $d$ is the dimension of the sample space. We also study the convergence rate when the strong-convexity assumption is absent. By considering the Langevin diffusion as a gradient flow in the space of probability distributions, we obtain an elegant analysis that applies to the stronger property of convergence in KL-divergence and gives a conceptually simpler proof of the best-known convergence results in weaker metrics.
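    The discrete Langevin diffusion analyzed in the abstract is the unadjusted Langevin algorithm (ULA): $x_{k+1} = x_k + \eta \nabla \log p^*(x_k) + \sqrt{2\eta}\,\xi_k$ with $\xi_k \sim N(0, I)$. A minimal sketch follows; the standard Gaussian target (for which $\log p^*$ is $1$-smooth and $1$-strongly convex), step size, and iteration count are illustrative assumptions, not values from the paper:

    ```python
    import numpy as np

    # Sketch of the unadjusted Langevin algorithm (ULA):
    #   x_{k+1} = x_k + eta * grad_log_p(x_k) + sqrt(2*eta) * N(0, I)
    # Target (an assumption for illustration): standard Gaussian N(0, I),
    # whose log density -||x||^2 / 2 is 1-smooth and 1-strongly convex.

    def grad_log_p(x):
        # Gradient of log p*(x) for the N(0, I) target.
        return -x

    def ula(d=5, eta=0.05, n_steps=5000, seed=0):
        rng = np.random.default_rng(seed)
        x = np.zeros(d)
        samples = np.empty((n_steps, d))
        for k in range(n_steps):
            noise = rng.standard_normal(d)
            x = x + eta * grad_log_p(x) + np.sqrt(2 * eta) * noise
            samples[k] = x
        return samples

    samples = ula()
    # After burn-in, the per-coordinate variance should sit near the
    # target's value of 1 (ULA has a small O(eta) discretization bias).
    print(samples[1000:].var(axis=0).mean())
    ```

    Note the bias: with a fixed step size, ULA's stationary distribution differs from $p^*$ by an amount that shrinks with $\eta$, which is why the paper's $\tilde{O}(d/\epsilon)$ guarantee ties the step size to the target accuracy $\epsilon$.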